Bootstrap-Based Regularization for Low-Rank Matrix Estimation
Authors
Abstract
We develop a flexible framework for low-rank matrix estimation that allows us to transform noise models into regularization schemes via a simple bootstrap algorithm. Effectively, our procedure seeks an autoencoding basis for the observed matrix that is stable with respect to the specified noise model; we call the resulting procedure a stable autoencoder. In the simplest case, with an isotropic noise model, our method is equivalent to a classical singular value shrinkage estimator. For non-isotropic noise models, e.g., Poisson noise, the method does not reduce to singular value shrinkage, and instead yields new estimators that perform well in experiments. Moreover, by iterating our stable autoencoding scheme, we can automatically generate low-rank estimates without specifying the target rank as a tuning parameter.
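To make the idea above concrete, here is a minimal sketch of a bootstrap-based stable autoencoder; it is not the authors' reference implementation, and the names (stable_autoencoder, sample_noisy_copy, n_boot, rank) are illustrative assumptions. It draws bootstrap replicates of the observed matrix from a user-specified noise model, fits the linear autoencoding map that is most stable to that noise, and returns the corresponding estimate; under an isotropic Gaussian noise model this procedure approaches a ridge-type shrinkage of the singular values, consistent with the abstract's claim that the isotropic case reduces to classical singular value shrinkage.

```python
# A minimal sketch (not the paper's reference code) of bootstrap-based
# stable autoencoding: find a linear map W such that noisy replicates of X,
# drawn from a user-specified noise model, are mapped back close to X.
import numpy as np

def stable_autoencoder(X, sample_noisy_copy, n_boot=50, rank=None, rng=None):
    """Return a regularized estimate of X.

    X                : (n, p) observed matrix.
    sample_noisy_copy: callable(X, rng) -> (n, p) perturbed copy of X,
                       encoding the assumed noise model (Gaussian, Poisson, ...).
    n_boot           : number of bootstrap replicates used to approximate the
                       expectation over the noise model.
    rank             : optional target rank; if given, the fit is truncated to
                       its top-`rank` singular components.
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape

    # Accumulate the normal equations for
    #   min_W  sum_b || X - X_tilde_b @ W ||_F^2,
    # whose solution is W = (sum_b X_tilde_b' X_tilde_b)^{-1} (sum_b X_tilde_b' X).
    gram = np.zeros((p, p))
    cross = np.zeros((p, p))
    for _ in range(n_boot):
        X_tilde = sample_noisy_copy(X, rng)
        gram += X_tilde.T @ X_tilde
        cross += X_tilde.T @ X
    W = np.linalg.solve(gram, cross)

    X_hat = X @ W
    if rank is not None:
        # Optional hard rank constraint via truncated SVD of the fit.
        U, s, Vt = np.linalg.svd(X_hat, full_matrices=False)
        X_hat = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return X_hat

# Example: isotropic Gaussian noise model, which (with many replicates)
# behaves like a ridge-type shrinkage of the singular values.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20)) @ rng.standard_normal((20, 30))
    gaussian = lambda X, rng: X + 0.5 * rng.standard_normal(X.shape)
    X_hat = stable_autoencoder(X, gaussian, n_boot=100)
```

A non-isotropic noise model is handled the same way: for count data, for instance, `sample_noisy_copy` could resample each entry from a Poisson distribution centered at the observed value, which is where the method departs from plain singular value shrinkage.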
Similar resources
Rank-One Matrix Completion with Automatic Rank Estimation via L1-Norm Regularization
Completing a matrix from a small subset of its entries, i.e., matrix completion, is a challenging problem arising from many real-world applications, such as machine learning and computer vision. One popular approach to solving the matrix completion problem is based on low-rank decomposition/factorization. Low-rank matrix decomposition-based methods often require a pre-specified rank, which is d...
Speaker adaptation based on sparse and low-rank eigenphone matrix estimation
The eigenphone-based speaker adaptation outperforms the conventional MLLR and eigenvoice methods when the adaptation data is sufficient, but it suffers from severe over-fitting when the adaptation data is limited. In this paper, l1 and nuclear norm regularization are applied simultaneously to obtain a more robust eigenphone estimation, resulting in a sparse and low-rank eigenphone matrix. The s...
Estimation of (near) low-rank matrices with noise and high-dimensional scaling
We study an instance of high-dimensional inference in which the goal is to estimate a matrix Θ ∈ ℝ^{k×p} on the basis of N noisy observations. The unknown matrix Θ is assumed to be either exactly low rank, or "near" low-rank, meaning that it can be well-approximated by a matrix with low rank. We consider a standard M-estimator based on regularization by the nuclear or trace norm over matrices, and ... (a generic sketch of this kind of nuclear-norm shrinkage follows the list below)
Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation
In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...
A Variational Approach for Sparse Component Estimation and Low-Rank Matrix Recovery
We propose a variational Bayesian algorithm for the estimation of the sparse component of an outlier-corrupted low-rank matrix, when linearly transformed composite data are observed. The model constitutes a generalization of robust principal component analysis. The problem considered herein is applicable in various practical scenarios, such as foreground detection in blurred and noisy vide...
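Two of the related items above (the eigenphone adaptation and the near-low-rank M-estimator entries) rely on nuclear-norm regularization. The sketch below shows its core computational step, singular value soft-thresholding, i.e., the proximal operator of the nuclear norm; it is a generic illustration rather than the algorithm of any cited paper, and the names svd_soft_threshold and lam are illustrative assumptions.

```python
# Generic singular value soft-thresholding: the proximal operator of the
# nuclear norm, the basic building block of nuclear-norm regularized
# matrix estimators. `lam` plays the role of the regularization parameter.
import numpy as np

def svd_soft_threshold(Y, lam):
    """Return argmin_X 0.5 * ||X - Y||_F^2 + lam * ||X||_* (nuclear norm)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)   # shrink each singular value toward zero
    return (U * s_shrunk) @ Vt            # components shrunk to zero are dropped

# Example: denoise a noisy low-rank matrix.
rng = np.random.default_rng(0)
L = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))  # rank-5 signal
Y = L + 0.3 * rng.standard_normal(L.shape)                       # noisy observation
L_hat = svd_soft_threshold(Y, lam=5.0)
print(np.linalg.matrix_rank(L_hat, tol=1e-8))  # rank after shrinkage, far below min(50, 40)
```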
Journal: Journal of Machine Learning Research
Volume: 17, Issue: -
Pages: -
Publication date: 2016